Patent abstract:
Driving assistance device and driving assistance method. When a rotation state detection unit 12 detects that a host vehicle is in a rotation state, a detection region modification unit 13 changes the position of a detection region with respect to the host vehicle, or the shape or area of the detection region (detection regions Raa, Rba), based on the rotation state of the host vehicle Ca. For example, the smaller the rotation radius of the host vehicle Ca, the shorter the detection region modification unit 13 makes the region length of the detection region. In this way, the region closest to the host vehicle Ca is retained, to a limited extent, as the detection regions Raa, Rba.
Publication number: BR112014001155B1
Application number: R112014001155-9
Filing date: 2012-07-17
Publication date: 2021-06-01
Inventors: Osamu Fukata; Yasuhisa Hayakawa; Chikao Tsuchiya; Masanori Furuya
Applicant: Nissan Motor Co., Ltd.
IPC main class:
Patent description:

Field of Invention
[001] The present invention relates to a driving assistance device and a driving assistance method.

Prior Art
[002] A driving assistance device is known that performs driving assistance by detecting a solid object around the vehicle. For example, this type of driving assistance device processes captured images sent in chronological order from an imaging means in order to detect the solid object.
[003] For example, Patent Document 1 presents an obstacle detection device capable of detecting solid objects. The obstacle detection device is provided with a real camera that photographs the surroundings of the vehicle, and an obstacle detection means for detecting a solid object using the image of the vehicle's surroundings input from the real camera. The obstacle detection means converts the viewpoint of the image of the vehicle's surroundings from the real camera, and detects the solid object using a difference image that corresponds to the difference between two chronologically different aerial view images.

Prior Art Patent Document

Patent Document 1: Unexamined Japanese Patent Publication No. 2008-227646

Summary of the invention

Problem to be solved by the invention
[004] However, with the technique presented in Patent Document 1, when the difference between two chronologically different aerial view images is used for solid object detection, road surface markings may be falsely recognized as solid objects while the vehicle is turning, for example, possibly leading to a deterioration in detection accuracy because the change in vehicle behavior is included in the difference image as noise.
[005] In view of this situation, the present invention aims to suppress the deterioration of detection accuracy attributable to the rotation state of the vehicle when detecting solid objects.
[006] To address this problem, the present invention has a rotation state detection means for detecting the rotation state of a host vehicle. When the rotation state detection means detects that the host vehicle is in a rotation state, a detection region modification means changes the position of a detection region with respect to the host vehicle, or the shape or area of the detection region, based on the rotation state of the host vehicle.

Effects of the invention
[007] According to the present invention, if the host vehicle is in the rotation state, a region that tends to generate false recognition of a solid object can be excluded from recognition by changing the position of the detection region in relation to the host vehicle, or by changing the shape or area of the detection region, based on the rotation state of the host vehicle; false recognition of solid objects can thus be controlled. Thereby, it is possible to suppress the deterioration of detection accuracy attributable to the rotation state of the vehicle when detecting solid objects.

Brief description of the drawings
[008] Figure 1 is an explanatory diagram that schematically illustrates a configuration of a driving assistance device.
[009] Figure 2 is a block diagram that functionally illustrates the configuration of a driving assistance device according to a first embodiment.
[010] Figure 3 is a flowchart that illustrates a series of operating procedures performed by the driving assistance device.
[011] Figure 4 is a flowchart that details the procedures for solid object detection used in step 6.
[012] Figure 5 is a diagram to describe the detection regions Ra, Rb.
[013] Figure 6 is a diagram to describe the state where the shape of the detection regions Raa, Rba is modified during the state of rotation.
[014] Figure 7 is a diagram to describe the state where the shape of the detection regions Ra, Rb is not modified during the state of rotation.
[015] Figure 8 is a diagram to describe the state where the shape of the detection regions Rab, Rbb is modified during a state of rotation.
[016] Figure 9 is a diagram to describe the state where the shape of the detection regions Rac, Rbc is modified during the state of rotation.
[017] Figure 10 is a block diagram that functionally illustrates the configuration of a driving assistance device according to a fourth embodiment.
[018] Figure 11 illustrates an example of the detection regions Raa, Rba during a turn (Example 1).
[019] Figure 12 illustrates an example of the detection regions Raa, Rba during a turn (Example 2).
[020] Figure 13 illustrates an example of the relationship between the speed at which the detection regions Raa, Rba return to an initial state, and the return amount of the steering wheel.
[021] Figure 14 is a diagram to describe the method used for detecting the amount of return of the steering wheel.
[022] Figure 15 is a diagram to describe another embodiment where the detection region is modified during the rotation state.

Preferred Embodiments of the Invention

(First Embodiment)
[023] Figure 1 is an explanatory diagram that schematically illustrates the configuration of a driving assistance device according to the present embodiment. The driving assistance device detects a rear vehicle approaching from behind the vehicle (host vehicle), and is mainly configured with a controller 10.
[024] Controller 10 works to comprehensively control the entire system and, for example, controller 10 can utilize a microcomputer primarily configured with a CPU, a ROM, a RAM and an I/O interface. Controller 10 performs various computations necessary for driving assistance according to the control programs stored in ROM. Controller 10 receives input information from a camera 1, a wheel speed sensor 2 and a steering angle sensor 3.
[025] Camera 1 can be positioned, for example, at a height h above the road surface and placed at the rear of the host vehicle Ca at an angle (depression angle) θ formed between the horizontal plane at the height of the camera and the camera's optical axis; camera 1 may have a built-in image sensor (e.g., a CCD or CMOS sensor). As illustrated in Figure 1, camera 1 periodically captures a landscape that includes a predetermined detection region (described later) and thus sends captured images in chronological order.
[026] Wheel speed sensor 2 is provided on each of the front, rear, left and right wheels, and detects the rotational speed of the wheel. The wheel speed sensor 2 detects a speed equivalent to the vehicle speed of the host vehicle Ca from the rotational speed of each of the wheels. Steering angle sensor 3 consists, for example, of an angle sensor installed on the steering column or close to the steering wheel, and detects the angle of rotation of the steering shaft as the steering angle.
[027] Figure 2 is a block diagram that functionally illustrates the configuration of the driving assistance device according to the present embodiment. In the driving assistance device, the controller 10 performs a predetermined process on the captured images sent in chronological order from the camera 1, and detects the rear vehicle based on the image obtained from the processing. In terms of the functions of the driving assistance device, the controller 10 has a viewpoint conversion unit 11, a rotation state detection unit 12, a detection region modification unit 13, and a solid object detection unit 14.
[028] The viewpoint conversion unit 11 converts a captured image sent from the camera 1 into an aerial view image (high-angle image) through viewpoint conversion. The aerial view image is a conversion of the image captured by the real camera 1 into a virtual image captured from a virtual viewpoint by a virtual camera. More specifically, the aerial view image corresponds to an image where the image captured by the real camera 1 has its viewpoint converted to that of an image looking down on the ground from a virtual viewpoint at a predetermined height (in other words, an image in which the captured image is projected onto the road surface).
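The geometry behind this projection can be sketched with a minimal pinhole-camera model, assuming a camera at height h pitched down by the angle θ described above. The function name and all parameter values below are illustrative, not taken from the patent.

```python
import math

def pixel_to_ground(u, v, h=1.0, theta=math.radians(30.0),
                    f=800.0, cu=640.0, cv=360.0):
    """Map image pixel (u, v) to road-plane coordinates (x forward, y lateral)
    for a camera at height h [m], pitched down by theta, with focal length f
    [px] and principal point (cu, cv). Hypothetical parameters."""
    # Depression angle of the viewing ray through image row v.
    alpha = theta + math.atan2(v - cv, f)
    if alpha <= 0:
        raise ValueError("ray does not intersect the road surface")
    x = h / math.tan(alpha)      # forward distance to the ground point
    slant = math.hypot(x, h)     # distance along the ray to that point
    y = (u - cu) / f * slant     # lateral offset (small-angle approximation)
    return x, y
```

Applying such a mapping to every pixel (or, equivalently, warping with the corresponding homography) yields the aerial view image; rows lower in the captured image land closer to the vehicle on the road plane.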
[029] The rotation state detection unit 12 detects the rotation state of the host vehicle Ca, which includes the rotation radius and the rotation direction of the host vehicle Ca, based on the detection information from the wheel speed sensor 2 and the steering angle sensor 3. Additionally, the rotation state detection unit 12 predicts the rotation state of the host vehicle Ca, which includes the rotation radius and the rotation direction of the host vehicle Ca. The rotation state detection unit 12 then determines whether or not the host vehicle Ca is in the rotation state according to the detection result or the prediction result.
[030] The detection region modification unit 13 modifies the shape of a detection region based on the rotation state detected by the rotation state detection unit 12. Techniques for modifying the shape of a detection region will be described later.
[031] The solid object detection unit 14 detects a solid object based on two chronologically successive aerial view images. Here, "two chronologically successive aerial view images" refers to two aerial view images taken at different capture times; for example, this corresponds to an aerial view image based on the image captured at a time t1 (the present; referred to below as the "present aerial view image"), and an aerial view image based on the image captured at a time t2 = t1 - Δt (Δt: the image output interval; referred to below as the "past aerial view image").
[032] More specifically, the solid object detection unit 14 first aligns the two chronologically successive aerial view images; in other words, the solid object detection unit 14 aligns the present aerial view image and the past aerial view image. Next, the solid object detection unit 14 obtains a difference image between the two aerial view images. The solid object detection unit 14 then detects a solid object based on the computed difference image (solid object detection means). In this case, the solid object detection unit 14 detects the solid object within the left rear and right rear detection regions of the host vehicle Ca; more specifically, the solid object detection unit 14 detects a solid object within a region that corresponds to an adjacent traffic lane as the rear vehicle (an adjacent vehicle).
[033] Figures 3 and 4 are flowcharts that illustrate a series of operating procedures performed by the driving assistance device according to the present embodiment. The processes illustrated in the flowcharts are performed by the controller 10 at predetermined times.
[034] First, in step 1 (S1), when the viewpoint conversion unit 11 acquires a captured image from the camera 1, the viewpoint conversion unit 11 performs a viewpoint conversion on the captured image and generates an aerial view image.
[035] In step 2 (S2), the rotation state detection unit 12 predicts whether or not the host vehicle Ca will be in a rotation state after a predetermined time (predicts the rotation state). More specifically, the rotation state detection unit 12 reads the image captured by the camera 1, detects a traffic lane (e.g., a white line) on the road, and calculates the lane curvature as a parameter representing the shape of the road. The rotation state detection unit 12 predicts the shape of the road ahead of the host vehicle Ca, and more specifically the rotation state of the host vehicle Ca at the point after the predetermined time, based on the calculated lane curvature and on the vehicle speed obtained from the wheel speed sensor 2.
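As a rough sketch of this prediction step, one can look up the lane curvature at the distance the vehicle will have covered after the predetermined time. The sampling scheme, threshold, and names below are assumptions introduced for illustration only.

```python
def predict_turning(curvatures_ahead, speed, t, curvature_threshold=1.0 / 300.0):
    """Predict whether the host vehicle will be in a rotation state after
    t seconds. curvatures_ahead: lane curvature [1/m] sampled at 1 m
    intervals ahead of the vehicle (estimated from the captured image);
    speed in m/s. The threshold value is illustrative."""
    distance = int(speed * t)                      # road position reached after t
    idx = min(distance, len(curvatures_ahead) - 1)
    return abs(curvatures_ahead[idx]) >= curvature_threshold
```

A curvature threshold of 1/300 corresponds to deciding that the vehicle is turning once the predicted radius drops below roughly 300 m; the actual decision value is not specified in this document.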
[036] In step 3 (S3), the rotation state detection unit 12 determines whether or not the host vehicle Ca is in a rotation state. More specifically, the rotation state detection unit 12 reads the vehicle speed obtained from the wheel speed sensor 2 and the steering angle obtained from the steering angle sensor 3, and computes the present rotation radius of the host vehicle Ca based on the following formula.

[Formula 1] ρ = (1 + KV²)(nL/Φ)
[037] In this formula, ρ is the rotation radius, K is the stability factor, V is the vehicle speed, L is the wheelbase, n is the steering gear ratio, and Φ is the steering angle.
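Formula 1 can be evaluated directly as shown below; the default parameter values (stability factor, wheelbase, gear ratio) are illustrative placeholders, since this document does not specify them.

```python
def turning_radius(v, phi, K=0.002, L=2.7, n=16.0):
    """Present rotation radius rho = (1 + K*V**2) * (n*L / phi).
    v: vehicle speed [m/s], phi: steering angle [rad], K: stability factor,
    L: wheelbase [m], n: steering gear ratio. Default values illustrative."""
    if phi == 0.0:
        return float("inf")  # straight-line driving: infinite radius
    return (1.0 + K * v * v) * (n * L / abs(phi))
```

At standstill (V = 0) the expression reduces to the Ackermann radius nL/Φ; the (1 + KV²) factor models the radius growing with speed due to understeer.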
[038] Finally, when the present rotation radius computed with Formula 1 and the rotation radius predicted in step 2 are not less than a predetermined threshold, the rotation state detection unit 12 determines that the host vehicle Ca is in the rotation state.
[039] If the result in step 3 is affirmative, in other words, if the host vehicle Ca is in the rotation state, processing continues to step 4 (S4). If, on the other hand, the result in step 3 is negative, in other words, if the host vehicle Ca is not in the rotation state, processing continues to step 6 (S6).
[040] In step 4, the present rotation radius is finally determined based on the rotation radii computed in the previously described steps 2 and 3. More specifically, with reference to the time information, the rotation state detection unit 12 predicts the present rotation radius based on the rotation radius predicted in step 2 for the point after the predetermined time. The rotation state detection unit 12 compares the predicted present rotation radius with the rotation radius calculated in step 3, and calculates a probability (in other words, a degree of plausibility) for the predicted present rotation radius. When the probability is not less than a predetermined decision value, the rotation state detection unit 12 specifies the rotation radius predicted in step 2 for the point after the predetermined time as the final rotation radius; whereas, when the probability is less than the predetermined decision value, the rotation state detection unit 12 determines the rotation radius calculated in step 3 as the final rotation radius.
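The selection between the predicted and the measured radius can be sketched as follows; the relative-agreement score and the decision value are hypothetical stand-ins for the plausibility computation described above.

```python
def final_rotation_radius(predicted, measured, decision_value=0.7):
    """Keep the predicted radius when it agrees well enough with the radius
    measured in step S3; otherwise fall back to the measured one. The
    agreement measure and decision value are illustrative assumptions."""
    agreement = 1.0 - abs(predicted - measured) / max(predicted, measured)
    return predicted if agreement >= decision_value else measured
```

For example, with radii of 100 m and 95 m the agreement is 0.95 and the predicted value is kept; with 100 m and 40 m the agreement drops to 0.4 and the measured value is used instead.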
[041] In step 5, the detection region modification unit 13 modifies the shape of the detection regions based on the final rotation radius specified in step 4. As illustrated in Figure 5, the detection regions are rectangular regions Ra, Rb that have a predetermined region length in the direction of travel FD, a predetermined region width in a direction orthogonal to the direction of travel FD, and symmetry with respect to the host vehicle Ca; the detection regions are set to extend rearward from the reference positions Pa, Pb, which are set to the left rear and right rear of the host vehicle Ca as the origin points. When the host vehicle Ca is traveling in a straight line, the detection regions are positioned and sized so that they lie in the adjacent traffic lanes to the rear left and rear right of the host vehicle Ca; the reference positions, the region lengths and the region widths are defined in advance accordingly.
[042] Incidentally, when the rotation state detection unit 12 determines that the vehicle is in the rotation state, as illustrated in Figure 6, the detection region modification unit 13 modifies the region length of the detection regions (detection regions Raa, Rba) to be shorter in the direction of travel than the detection regions Ra, Rb used as a reference (refer to Figure 5). Additionally, the modified detection regions Raa, Rba are adjusted so that the detection region corresponding to the inner side of the curve has a greater degree of modification than the detection region corresponding to the outer side of the curve. By this, the detection regions Ra, Rb that are symmetrical with respect to the host vehicle Ca are modified into an asymmetrical shape (detection regions Raa, Rba).
[043] The degree of modification of each of the detection regions Raa, Rba is determined according to the rotation radius; that is, the degree of modification is determined according to the rotation radius so as to exclude the part of the detection region that can generate false recognition of a solid object. For example, a relationship is established such that the smaller the rotation radius, the greater the degree of modification of each of the detection regions Raa, Rba. As described above, however, the relationship is established so that the degree of modification differs between the detection region on the inner side of the curve and the detection region on the outer side of the curve, even in the same rotation state.
[044] For example, the detection region modification unit 13 can maintain a map or an arithmetic expression for a correspondence relationship between the rotation radius and the detection regions Raa, Rba modified according to the rotation radius. In this way, the detection region modification unit 13 can adjust the modified detection regions Raa, Rba on the basis of the final rotation radius specified in step 4.
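Such a map can be as simple as a piecewise-linear lookup. The breakpoints and scale factors below are hypothetical, chosen only so that a smaller radius yields a shorter region and the inner side shrinks more than the outer side, as described above.

```python
import numpy as np

def modified_region_lengths(radius, base_length=7.0):
    """Return (inner, outer) region lengths [m] for a given rotation
    radius [m], shortening the reference length base_length. The
    breakpoints and scale factors are illustrative assumptions."""
    radii = [30.0, 100.0, 300.0]    # lookup breakpoints [m]
    inner_scale = [0.4, 0.6, 0.9]   # inner side of the curve shrinks more
    outer_scale = [0.6, 0.8, 1.0]
    inner = base_length * np.interp(radius, radii, inner_scale)
    outer = base_length * np.interp(radius, radii, outer_scale)
    return inner, outer
```

np.interp clamps outside the breakpoint range, so very large radii simply return lengths close to the reference values.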
[045] In step 6 (S6), the solid object detection unit 14 detects a solid object. Figure 4 is a flowchart detailing the procedures for solid object detection used in step 6.
[046] First, in step 60 (S60), the solid object detection unit 14 performs an alignment using the present aerial view image and the past aerial view image. Here, "alignment" refers to processing the position of one aerial view image (the past aerial view image) to align with the other aerial view image (the present aerial view image), so that the locations of a fixed reference object in the images, such as a white line on the road surface, a traffic sign, or a patch of dirt, match between the two chronologically successive aerial view images. Although several techniques are available for performing the alignment, in this embodiment, in order to reduce the number of computations, the alignment technique used involves calculating the amount of movement of the host vehicle Ca during one image acquisition cycle of the camera 1 from the vehicle speed, and compensating one of the aerial view images by that amount of movement. If accuracy is a priority, the alignment can instead be performed by matching the fixed objects referenced in the two aerial view images using a comparison process or the like.
[047] In step 61 (S61), the solid object detection unit 14 generates a difference image. More specifically, the solid object detection unit 14 calculates the difference between the common parts of the aligned present aerial view image and the past aerial view image, and outputs the computation result as the difference image. While the difference can be computed using a method based on the absolute difference in brightness values, it can also be computed by performing edge point detection using a Laplacian filter or the like, and calculating the difference based on the positions of the edge points.
[048] In step 62 (S62), the solid object detection unit 14 performs threshold processing. More specifically, the solid object detection unit 14 converts the difference image to binary using a predetermined threshold, so that a region not smaller than the threshold is specified as a solid object. Additionally, the solid object detection unit 14 detects the solid object within the detection regions Ra, Rb, or the modified detection regions Raa, Rba, as an adjacent vehicle (more specifically, a vehicle traveling side by side, which is a rear vehicle traveling in the adjacent traffic lane).
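Steps S60 to S62 can be condensed into a short sketch: motion-compensate the past aerial view image, take the absolute brightness difference, and binarize it. The shift-based alignment, threshold, and pixel-count criterion below are illustrative assumptions.

```python
import numpy as np

def detect_solid_object(past_bev, present_bev, shift_rows,
                        threshold=25, min_pixels=20):
    """Return (binary difference mask, detected?) for two aerial view
    images, given the ego-motion shift in pixel rows. np.roll wraps
    around; real code would pad the vacated rows instead. The threshold
    and pixel-count values are illustrative."""
    aligned = np.roll(past_bev, shift_rows, axis=0)   # S60: alignment
    diff = np.abs(present_bev.astype(np.int16)
                  - aligned.astype(np.int16))          # S61: difference image
    mask = diff >= threshold                           # S62: binarization
    return mask, int(mask.sum()) >= min_pixels
```

Restricting `mask` to the (possibly modified) detection regions before counting pixels would complete the step as described above.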
[049] In this way, in the first embodiment, the detection region modification unit 13 compares the case where the rotation state detection unit 12 determines that the host vehicle Ca is in the rotation state with the case where it determines that the host vehicle Ca is not in the rotation state (Figure 5), and modifies the shape of the detection regions to exclude a region that could generate false recognition of a solid object. In other words, the detection region modification unit 13 modifies the shape and area of the detection regions to reduce the region length of the detection regions in the direction of travel, and can thus exclude from the detection regions a region that tends to generate false recognition of a solid object.
[050] Solid object detection based on captured images taken from the rear of the vehicle is such that the farther a solid object is from the host vehicle Ca, the more the difference image is affected by noise attributable to the turning behavior of the host vehicle Ca; as a result, there is a risk that a solid object will be falsely recognized. Here, in accordance with the present embodiment, when the host vehicle Ca is in the rotation state, modifying the shape and area of the detection regions excludes, as needed, the parts of the detection regions that can generate false recognition of a solid object. In this way, it is possible to suppress the deterioration of detection accuracy attributable to the rotation state of the host vehicle Ca.
[051] Additionally, the detection region modification unit 13 modifies the region length, in the direction of travel of the vehicle, of the detection region on the inner side of the curve (detection regions Raa, Rba in Figure 6) according to the rotation radius of the host vehicle Ca. In the present embodiment, the smaller the rotation radius of the host vehicle Ca, the shorter the detection region modification unit 13 makes the region length of the detection region. In this way, the region closest to the host vehicle Ca is retained, to a limited extent, as the detection regions Raa, Rba.
[052] According to this configuration, the region farthest from the host vehicle Ca can be excluded from the detection region; therefore, it is possible to suppress the deterioration of detection accuracy attributable to the rotation state of the host vehicle Ca. Note that this type of modification to the shape of the detection region is sufficient if performed on at least the detection region on the inner side of the curve.
[053] As illustrated in Figure 7, if, while the host vehicle Ca is turning, the detection regions (regions Ra, Rb) are set in the same way as when the host vehicle Ca is not turning, the detection regions Ra, Rb will include an area away from the adjacent traffic lane, and this can become a primary factor in the deterioration of detection accuracy. However, according to the present embodiment, shortening the region length of the detection region in the direction of travel FD of the vehicle can thereby keep the modified detection regions Raa, Rba within an area corresponding to the adjacent traffic lane. In this way, it is possible to suppress the deterioration of detection accuracy attributable to the rotation state of the host vehicle Ca.
[054] Additionally, the detection region modification unit 13 modifies the shape and area of the individual detection regions so that the degree of modification to the region length of the detection region corresponding to the inner side of the rotation direction is greater than that of the detection region corresponding to the outer side of the rotation direction.
[055] As illustrated in Figure 7, the detection region corresponding to the inner side of the curve direction has a greater portion that is far from the adjacent traffic lane. Therefore, by making the degree of modification of the region length different between the outer side and the inner side of the rotation direction, the detection regions Raa, Rba can be properly adjusted. In this way, the necessary solid objects can be properly detected while false solid object detection is suppressed.
[056] Additionally, in the present embodiment, the rotation state detection unit 12 has a rotation state prediction means for predicting the rotation state of the vehicle, and the detection region modification unit 13 modifies the shape and area of the detection regions when the rotation state prediction means predicts the rotation state of the host vehicle Ca.
[057] According to this configuration, the detection regions can be modified in anticipation of the rotation state; therefore, it is possible to modify the detection regions at the appropriate time.
[058] When the rotation state is predicted and the shape of the detection regions is to be modified accordingly, the detection region modification unit 13 can perform the modification immediately; on the other hand, when the vehicle goes from the rotation state to the non-rotation state and the detection regions must be returned to the initial state (reference state), the detection region modification unit 13 can perform the modification slowly. Thus, situations where noise attributable to the rotation state of the host vehicle is extracted in the difference image can be suppressed, and therefore false detection of solid objects can be suppressed. Additionally, this type of control method is particularly effective in the case where the rotation state of the host vehicle is induced by the host vehicle Ca changing traffic lanes. In this case, it is preferable that the controller 10 be provided with functional elements such as a lane change intention detection means for detecting the intention to change traffic lanes; the technique described above can be adopted when the lane change intention detection means detects the intention to change traffic lanes, the vehicle goes from the rotation state to the non-rotation state, and the detection regions are returned to the initial state.
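The asymmetric timing described above (modify immediately, return slowly) amounts to rate-limiting only the expanding direction. The sketch below assumes the region length is the modified quantity and uses an illustrative return rate.

```python
def update_region_length(current, target, return_rate=0.2):
    """Step the region length toward its target once per control cycle:
    shortening (entering a turn) is applied at once, while returning to
    the initial length is limited to return_rate metres per cycle.
    The rate value is an illustrative assumption."""
    if target <= current:
        return target                          # shrink immediately
    return min(current + return_rate, target)  # grow back slowly
```

The return rate could itself be scaled by the steering-wheel return amount, in the spirit of the relationship shown in Figure 13.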
[059] In addition, the detection region modification unit 13 can modify the shape of the detection regions according to a variation in the longitudinal acceleration of the host vehicle Ca. The longitudinal acceleration of the host vehicle Ca also tends to be extracted in the difference image as noise attributable to the behavior of the vehicle Ca, which creates the possibility of a deterioration in detection accuracy. Therefore, taking the variation in longitudinal acceleration into account when modifying the shape of the detection regions can thus suppress the deterioration of detection accuracy attributable to the vehicle's behavior.

(Second Embodiment)
[060] Figure 8 is an explanatory diagram that schematically illustrates the modification of the shape of the detection regions Rab, Rbb according to the second embodiment. A vehicle driving assistance device according to the second embodiment will be described below. The feature distinguishing the vehicle driving assistance device according to the second embodiment from the first embodiment is the technique that the detection region modification unit 13 uses to modify the detection regions. Features that are similar to the first embodiment are omitted from this explanation, which mainly covers the distinguishing feature.
[061] In the present embodiment, the detection regions are rectangular regions that have a predetermined region length in the direction of travel FD, and a predetermined region width in a direction orthogonal to the direction of travel FD; the reference positions Pa, Pb are respectively set to the left rear and the right rear of the host vehicle Ca, and the detection regions are set to extend rearward with the reference positions as origin points.
[062] In a scene where the rotation state detection unit 12 determines that the vehicle is in the rotation state, the detection region modification unit 13 sets the detection regions Rab, Rbb in a position offset from the reference detection regions Ra, Rb (refer to Figure 5), as illustrated in Figure 8. More specifically, the detection region modification unit 13 sets the detection regions Rab, Rbb in a position rotated, relative to the reference detection regions Ra, Rb (refer to Figure 5), in the direction opposite to the rotation direction of the host vehicle Ca. For example, as illustrated in Figure 8, when the host vehicle Ca is turning to the right, the detection region modification unit 13 sets the detection regions Rab, Rbb in a position rotated to the left relative to the detection regions Ra, Rb illustrated, for example, in Figure 7. In this way, rotating the detection regions Rab, Rbb in the direction opposite to the rotation direction of the host vehicle Ca moves the detection regions Rab, Rbb along the shape of the road, as illustrated in Figure 8. In addition, the detection region modification unit 13 modifies the position of the individual detection regions Rab, Rbb so that the degree of modification for the detection region corresponding to the inner side of the rotation direction is greater than the degree of modification for the detection region corresponding to the outer side of the rotation direction.
For example, in the example illustrated in Figure 8, let θbb be the rotation angle of the detection region Rbb with respect to the centerline L of the host vehicle Ca in the direction of travel FD, and let θab be the rotation angle of the detection region Rab with respect to the centerline L of the host vehicle Ca in the direction of travel FD; the detection region modification unit 13 then rotates and moves the detection regions Rab, Rbb so that the rotation angle θab of the detection region Rab, which corresponds to the inner side of the rotation direction, is greater than the rotation angle θbb of the detection region Rbb, which corresponds to the outer side of the rotation direction.
[063] The degree of modification for each of the detection regions Rab, Rbb is determined according to the rotation radius of the host vehicle Ca during the rotation state, so as to follow the shape of the road. For example, the detection region modification unit 13 adjusts the detection regions Rab, Rbb so that the smaller the rotation radius of the host vehicle Ca, the greater the rotation angle (θab, θbb) of the detection regions Rab, Rbb. Thus, as described above, the degree of modification differs between the detection region Rab on the inner side of the rotation direction and the detection region Rbb on the outer side, even in the same rotation state.
[064] For example, the detection region modification unit 13 can maintain a map or an arithmetic expression for a correspondence relationship between the rotation radius and the detection regions Rab, Rbb modified according to the rotation radius. The detection region modification unit 13 modifies the detection regions Rab, Rbb based on the final rotation radius specified in step 4.
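Rotating a detection region by its angle θab or θbb is plain 2-D geometry; the sketch below rotates a region's corner points about its reference position. Function and variable names are illustrative.

```python
import math

def rotate_region(corners, angle, pivot):
    """Rotate a detection region, given as a list of (x, y) corner points
    [m], by `angle` radians about the reference position `pivot` (e.g. Pa
    or Pb). Positive angles rotate counter-clockwise."""
    px, py = pivot
    c, s = math.cos(angle), math.sin(angle)
    return [(px + c * (x - px) - s * (y - py),
             py + s * (x - px) + c * (y - py)) for x, y in corners]
```

The inner-side region would be passed the larger angle (θab) and the outer-side region the smaller one (θbb), both signed opposite to the turn direction, as described above.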
[065] In this way, in the present embodiment, the detection region modification unit 13 rotates and moves the position of the detection regions (detection regions Rab, Rbb in Figure 8) according to the rotation radius of the host vehicle Ca.
[066] According to this configuration, moving, or more specifically rotating, the detection regions in the direction of travel FD of the vehicle to follow the shape of the road allows the moved detection regions Rab, Rbb to extend over the lanes that correspond to the adjacent traffic lanes. In this way, it is possible to suppress false detection of solid objects attributable to the rotation state of the host vehicle Ca. (Third embodiment)
[067] Figure 9 is an explanatory diagram that schematically illustrates the modification of the shape of the detection regions Rac, Rbc according to the third embodiment. A vehicle driving assistance device according to the third embodiment will be described below. The distinguishing feature between the vehicle driving assistance device according to the third embodiment and the first embodiment is the technique that the detection region modification unit 13 uses to modify the detection regions. Features that are similar to the first embodiment are omitted from this explanation, which mainly covers the distinguishing feature.
[068] In the third embodiment, when the rotation state detection unit 12 determines that the vehicle is in the rotation state, as illustrated in Figure 9, the position of the detection region Rac that corresponds to the inner side of the direction of rotation is moved toward the inner side of the direction of rotation of the host vehicle Ca. For example, in the example illustrated in Figure 9, when the rotation state detection unit 12 determines that the vehicle is in the rotation state, the detection region modification unit 13 moves the position of the detection region Rac, which corresponds to the inner side of the direction of rotation, in a direction away from the centerline L in the displacement direction of the host vehicle Ca; in other words, the detection region modification unit 13 moves the position of the detection region Rac in a direction such that the distance D from the centerline L in the displacement direction FD of the host vehicle Ca to the detection region Rac becomes greater.
[069] Additionally, while the host vehicle Ca is in the rotation state, the detection region modification unit 13 adjusts the position of the detection region Rac that corresponds to the inner side of the direction of rotation based on the rotation radius of the host vehicle Ca. More specifically, the detection region modification unit 13 adjusts the position of the detection regions Rac, Rbc that correspond to the inner side of the direction of rotation so that the smaller the rotation radius of the host vehicle Ca, the greater the distance D from the centerline L in the displacement direction FD of the host vehicle Ca to those detection regions; conversely, the larger the rotation radius of the host vehicle Ca, the smaller the distance D from the centerline L in the displacement direction FD of the host vehicle Ca to the detection regions Rac, Rbc that correspond to the inner side of the direction of rotation.
[070] For example, the detection region modification unit 13 can maintain a map or an arithmetic expression for the correspondence relationship between the rotation radius and the detection regions Rac, Rbc modified according to that radius. The detection region modification unit 13 modifies the detection regions Rac, Rbc based on the final rotation radius specified in step 4.
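The inverse relationship between rotation radius and the offset distance D can likewise be sketched as a simple monotone mapping. The function below is an illustrative assumption: the limits d_min, d_max, r_min, r_max and the linear form are not from the specification, which only fixes the direction of the relationship (smaller radius, larger D).

```python
def inner_region_offset(radius_m: float,
                        d_min: float = 0.3, d_max: float = 1.5,
                        r_min: float = 20.0, r_max: float = 200.0) -> float:
    """Distance D from the centerline L to the inner-side detection region.

    Smaller rotation radius -> larger D (the region is moved further toward
    the inner side of the turn); larger radius -> smaller D.
    All numeric constants are illustrative assumptions.
    """
    r = min(max(radius_m, r_min), r_max)   # clamp to the mapped range
    t = (r - r_min) / (r_max - r_min)      # 0 at r_min, 1 at r_max
    return d_max + t * (d_min - d_max)     # linear, decreasing in r
```

Clamping the radius keeps the offset bounded on very tight or nearly straight paths, matching the idea that the region never leaves the band of plausible adjacent-lane positions.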
[071] Furthermore, the device can be configured so that, when moving the position of the detection regions Rac, Rbc that correspond to the inner side of the direction of rotation in a direction away from the centerline L in the displacement direction FD of the host vehicle Ca, their position is moved in the vehicle width direction, and in the direction of travel of the host vehicle Ca, so that the detection regions Rac, Rbc do not fall within the traffic lane in which the host vehicle is traveling, and do not fall within the traffic lane two lanes over, beyond the lane adjacent to the traffic lane in which the host vehicle Ca is traveling.
[072] As described above, according to the present embodiment, in addition to the effects of the first embodiment, moving the detection regions Rac, Rbc that correspond to the inner side of the direction of rotation of the host vehicle Ca provides the advantage of effectively preventing those detection regions from falling within the traffic lane in which the host vehicle Ca is traveling, and thus suppresses a trailing vehicle traveling in the traffic lane of the host vehicle Ca from being falsely recognized as an adjacent vehicle traveling in the traffic lane adjacent to the host vehicle Ca. (Fourth embodiment)
[073] Figure 10 illustrates a vehicle driving assistance device according to the fourth embodiment. The distinguishing feature between the driving assistance device according to the fourth embodiment and the first embodiment is the technique that the rotation state detection unit 12 uses to detect the rotation state. Features that are similar to the first embodiment are omitted from this explanation, which mainly covers the distinguishing feature.
[074] More specifically, the rotation state detection unit 12 can read information from a state detection unit 5, a camera 6, and a navigation system 7. The state detection unit 5 is configured with various sensors for respectively detecting the operating states of the accelerator pedal, the brake pedal, and the driver-operated turn indicators, and the state of the vehicle, such as the yaw rate or the lateral acceleration. Additionally, the camera 6 is placed at the front of the host vehicle Ca; the camera 6 periodically photographs the scene in the displacement direction FD of the host vehicle Ca and thus sends captured images (image information) in chronological order. The navigation system 7 stores map information in which road information is linked to position information, and acquires the position of the host vehicle Ca by means of a GPS sensor, in order to display the present position of the host vehicle Ca on the map information and to provide route guidance to a waypoint.
[075] With this type of configuration, the rotation state detection unit 12 in the first embodiment predicts the shape of the road using the images of the area behind the vehicle captured by the camera 1. However, the rotation state detection unit 12 can instead use the images of the area in front of the vehicle captured by the camera 6 to recognize a traffic lane and thus predict the rotation state.
[076] Additionally, the rotation state detection unit 12 can predict the shape of the road from the driver-operated states (for example, the accelerator pedal, the brake pedal, the turn indicators, the steering wheel, and so on) as detected by the state detection unit 5. Furthermore, the rotation state detection unit 12 can predict the rotation state according to the map information or the present position information of the host vehicle Ca from the navigation system 7, and so on.
[077] In the embodiments described above, the rotation state detection unit 12 computes the rotation radius of the host vehicle Ca according to formula 1 based on the speed of the host vehicle Ca, the steering angle of the host vehicle Ca, and various vehicle-related parameters. However, the rotation state detection unit 12 can compute the rotation radius of the vehicle based on the difference in wheel speed between the wheels provided on the host vehicle Ca and various vehicle-related parameters, or it can compute the rotation radius of the host vehicle Ca based on the images captured by the camera 1 or the camera 6. Finally, the rotation state detection unit 12 can compute the rotation radius of the host vehicle Ca based on the yaw rate obtained as the vehicle state from the state detection unit 5, the lateral acceleration, and the vehicle speed, or it can compute the rotation radius of the host vehicle Ca based on the map information obtained from the navigation system 7 and on the position of the host vehicle Ca.
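Formula 1 itself is not reproduced in this excerpt, but the alternative computations named above correspond to standard kinematic relations, which can be sketched as follows; treat them as assumed stand-ins for the computations the text refers to, not as the patent's own formulas.

```python
def radius_from_yaw_rate(speed_mps: float, yaw_rate_rps: float) -> float:
    """Rotation radius from vehicle speed and yaw rate: R = v / omega.

    Valid only while the vehicle is actually turning (omega != 0).
    """
    return speed_mps / yaw_rate_rps

def radius_from_wheel_speeds(v_outer: float, v_inner: float,
                             track_width_m: float) -> float:
    """Rotation radius from the left/right wheel-speed difference.

    Both wheels of an axle sweep arcs about the same turn center, so
    v_outer / v_inner = (R + T/2) / (R - T/2), which rearranges to
    R = (T/2) * (v_outer + v_inner) / (v_outer - v_inner).
    """
    return (track_width_m / 2.0) * (v_outer + v_inner) / (v_outer - v_inner)
```

For example, a vehicle at 10 m/s with a yaw rate of 0.2 rad/s is on a 50 m radius arc; the same radius follows from wheel speeds of 50.8 and 49.2 (arbitrary units) on a 1.6 m track.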
[078] According to such an embodiment, several techniques can be used to predict the rotation state and to compute the rotation radius of the host vehicle Ca. In this way, the rotation state can be accurately predicted and the rotation radius of the host vehicle Ca can be precisely detected. As a result, the shape of the detection region can be suitably modified, making it possible to effectively suppress false detection of solid objects. (Fifth embodiment)
[079] Figures 11 and 12 are explanatory diagrams that schematically illustrate the modification of the shape of the detection regions according to the fifth embodiment. A vehicle driving assistance device according to the fifth embodiment will be described below. The distinguishing feature between the vehicle driving assistance device according to the fifth embodiment and the first embodiment is the technique that the detection region modification unit 13 uses to modify the detection region. Features that are similar to the first embodiment are omitted from this explanation, which mainly covers the distinguishing feature. Additionally, as illustrated in Figure 11 and Figure 12, the fifth embodiment describes an example where the host vehicle Ca is traveling through a roundabout (a traffic circle).
[080] For example, as illustrated in Figure 11, in the situation where the host vehicle Ca enters a roundabout and is turning inside it (for example, the situation where the host vehicle Ca is at position P1 illustrated in Figure 11), then, as in the first embodiment, the detection region modification unit 13 modifies the region length of the detection regions (detection regions Raa, Rba) so that the region length in the displacement direction FD is shorter than that of the detection regions Ra, Rb that serve as a reference (with reference to Figure 5). Additionally, in this case, as in the first embodiment, the detection region modification unit 13 adjusts the detection regions so that the degree of modification for the detection region Raa, which corresponds to the inner side of the curve, is greater than the degree of modification for the detection region Rba, which corresponds to the outer side of the curve.
[081] Additionally, as illustrated in Figure 11, in the situation where the host vehicle Ca is turning inside the roundabout (for example, the situation where the host vehicle Ca is at position P1 illustrated in Figure 11), the steering wheel is turned to the right; then, in the situation where the host vehicle Ca starts to proceed out of the roundabout (for example, the situation where the host vehicle Ca moves from position P1 to position P2 illustrated in Figure 11), the steering wheel is turned to the left. In this case, the host vehicle Ca is in a left-turning state, and the detection region modification unit 13 changes the shape of the detection region Rba, now on the inner side of the curve, so that its region length in the displacement direction FD is shorter.
[082] Also, in the situation where the host vehicle Ca proceeds out of the roundabout (for example, the situation where the host vehicle Ca moves from position P1 to position P2 illustrated in Figure 11), the steering wheel is turned from the right toward the left, which causes the rotation state detection unit 12 to detect the steering wheel return operation and to detect a steering wheel return amount resulting from that operation. When the steering wheel return amount is detected, the detection region modification unit 13 starts the process of returning the detection regions Raa, Rba to their initial states (Ra, Rb illustrated in Figure 5).
[083] For example, in the example illustrated in Figure 11, when the host vehicle Ca moves from position P1 to position P2, the steering wheel is turned from the right toward the left, and a return amount toward the left is detected for the steering wheel. The detection region modification unit 13 then starts the process of returning to the initial state Ra the detection region Raa, which corresponded to the inner side of the direction of rotation while turning in the roundabout. In other words, when the return amount toward the left is detected for the steering wheel, the detection region modification unit 13 gradually extends the detection region Raa in the displacement direction FD so that the region length of the detection region Raa, set at the right rear of the host vehicle Ca, becomes the same as the region length of the detection region Ra in the initial state.
[084] Furthermore, in the situation where the host vehicle Ca moves from position P2 illustrated in Figure 11 toward the exit of the roundabout to a position P3 illustrated in Figure 12, a return amount toward the right is detected for the steering wheel, which likewise starts the process of returning the detection region Rba, set at the left rear of the host vehicle, to the initial state Rb. Additionally, in the situations illustrated in Figure 11 and Figure 12, the process of returning the detection regions Raa, Rba to the initial states Ra, Rb is started for the detection region Raa, set at the right rear of the host vehicle Ca, before it is started for the detection region Rba, set at the left rear of the host vehicle Ca. Therefore, in the situation illustrated in Figure 12, the region length of the detection region Raa set at the right rear of the host vehicle Ca is longer than that of the detection region Rba set at the left rear of the host vehicle Ca.
[085] Additionally, in the present embodiment, the detection region modification unit 13 determines a return speed V for returning the detection regions Raa, Rba to the initial states Ra, Rb based on the return amount of the steering wheel. Here, Figure 13 illustrates an example of the relationship between the return speed V for returning the detection regions Raa, Rba to the initial states Ra, Rb and the steering wheel return amount Q.
[086] As illustrated in Figure 13, the greater the absolute value of the steering wheel return amount Q, the slower the return speed V set by the detection region modification unit 13 for returning the reduced region length of the detection regions Raa, Rba to the initial states Ra, Rb; and the smaller the absolute value of the steering wheel return amount Q, the faster the return speed V set by the detection region modification unit 13 for returning the reduced region length of the detection regions Raa, Rba to the initial states Ra, Rb. More specifically, as illustrated in Figure 13, the detection region modification unit 13 returns the detection regions Raa, Rba to the initial states Ra, Rb at a predetermined speed V1 when the absolute value of the steering wheel return amount Q is less than a predetermined value S1; in addition, the detection region modification unit 13 returns the detection regions Raa, Rba to the initial states Ra, Rb at a predetermined speed V2, slower than the predetermined speed V1, when the absolute value of the steering wheel return amount Q is not less than a predetermined value S2 that is greater than the predetermined value S1. Additionally, when the absolute value of the steering wheel return amount Q is greater than or equal to the predetermined value S1 and less than the predetermined value S2, the detection regions Raa, Rba are returned to the initial states Ra, Rb at a speed such that the greater the absolute value of the steering wheel return amount Q, the slower the return speed, within the range from the predetermined speed V1 to the predetermined speed V2.
Thus, in the fifth embodiment, the greater the absolute value of the steering wheel return amount Q, the longer the time needed to return the reduced region length of the detection regions Raa, Rba to the initial states Ra, Rb; and the smaller the absolute value of the steering wheel return amount Q, the shorter the time needed to return the reduced region length of the detection regions Raa, Rba to the initial states Ra, Rb.
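The Figure 13 mapping described above is a piecewise function of |Q| and can be sketched as follows. The specific numbers used in the usage note are illustrative assumptions; only the structure (V1 below S1, a slower V2 at or above S2, and a monotone decrease in between) comes from the text.

```python
def return_speed(q: float, s1: float, s2: float,
                 v1: float, v2: float) -> float:
    """Return speed V as a function of the steering wheel return amount Q.

    |Q| <  S1        -> V1 (the faster speed)
    |Q| >= S2        -> V2 (the slower speed, V2 < V1)
    S1 <= |Q| < S2   -> linear interpolation: the larger |Q|, the slower V
    """
    q_abs = abs(q)
    if q_abs < s1:
        return v1
    if q_abs >= s2:
        return v2
    t = (q_abs - s1) / (s2 - s1)
    return v1 + t * (v2 - v1)
```

With assumed values S1 = 10, S2 = 30, V1 = 2.0, V2 = 0.5, a return amount of 20 yields the midpoint speed 1.25, and any |Q| of 30 or more yields the slow speed 0.5.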
[087] Furthermore, the detection method used for detecting the steering wheel return amount is not particularly limited; in the present embodiment, the rotation state detection unit 12 detects the steering wheel return amount Q based on a variation in the steering angle detected by the steering angle sensor 3. Here, Figure 14 is a diagram describing the method used for detecting the steering wheel return amount. The following description of that detection method is made with reference to Figure 14.
[088] That is, first, the rotation state detection unit 12 processes the steering angle detected by the steering angle sensor 3 using low-pass filters with different characteristics (low-pass filter A and low-pass filter B). Here, as illustrated in Figure 14, low-pass filter A has high (fast) tracking (responsiveness) with respect to the steering angle detected by the steering angle sensor 3, and low-pass filter B has low (slow) tracking (responsiveness) with respect to the steering angle detected by the steering angle sensor 3.
[089] Taking these characteristics of the low-pass filters into account, as illustrated in Figure 14, the rotation state detection unit 12 detects the steering wheel return amount Q by taking the difference between the steering angle filtered with low-pass filter A and the steering angle filtered with low-pass filter B at a time after a predetermined time has elapsed (for example, time t2 shown in Figure 14) from the time (time t1 shown in Figure 14) when the steering wheel return was performed.
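The two-filter scheme can be sketched with first-order IIR low-pass filters; the smoothing factors alpha_fast and alpha_slow are assumed values standing in for the filter characteristics A and B of Figure 14. While the steering angle is steady the two filter outputs agree, so their difference is nonzero only during and shortly after a change, which is what makes it usable as a return-amount signal.

```python
def lowpass(samples, alpha):
    """First-order IIR low-pass: y[k] = y[k-1] + alpha * (x[k] - y[k-1]).

    Larger alpha -> faster tracking of the input (filter A);
    smaller alpha -> slower tracking (filter B).
    """
    y, out = samples[0], []
    for x in samples:
        y = y + alpha * (x - y)
        out.append(y)
    return out

def steering_return_amount(steering_angles,
                           alpha_fast=0.6, alpha_slow=0.1):
    """Difference between the fast filter (A) and the slow filter (B)
    outputs at the latest sample; approximates the return amount Q shortly
    after the steering wheel is turned back.
    """
    fa = lowpass(steering_angles, alpha_fast)
    fb = lowpass(steering_angles, alpha_slow)
    return fa[-1] - fb[-1]
```

With this sign convention, a steering angle that steps downward (a return toward the left of a right turn, for instance) yields a negative difference, consistent with the sign-based direction test described in the next paragraph.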
[090] The detection region modification unit 13 determines whether the steering wheel return amount acquired from the rotation state detection unit 12 is a positive value or a negative value in order to determine the direction of the steering wheel return. For example, if the unit is designed so that when the steering wheel return operation is performed toward the left the steering wheel return amount is detected as a positive value, and when the steering wheel return operation is performed toward the right the steering wheel return amount is detected as a negative value, then the detection region modification unit 13 can determine that the steering wheel is moving toward the left when the detected steering wheel return amount is a positive value, and thus return the right-rear detection region Raa to the initial state Ra.
[091] As described above, in the present embodiment, as illustrated in Figure 11 and Figure 12, in situations where the host vehicle Ca proceeds out of a roundabout and the like, when a steering wheel return is performed, the detection regions Raa, Rba with reduced region length are gradually returned to the initial states Ra, Rb based on the steering wheel return amount. Thus, in situations where the host vehicle Ca is proceeding out of the roundabout and the like, it is possible to avoid detecting, in the detection regions Raa, Rba, a trailing vehicle traveling in the same traffic lane as the host vehicle Ca, thus effectively preventing such a trailing vehicle from being falsely recognized as an adjacent vehicle traveling in the traffic lane adjacent to the host vehicle Ca.
[092] Additionally, the present embodiment makes it possible to effectively address the following problems. That is, if the roundabout radius is small and the steering wheel return amount Q is large, the detection regions Raa, Rba tend to be set over the traffic lane in which the host vehicle Ca is traveling, raising the problem of falsely detecting a trailing vehicle traveling in the traffic lane of the host vehicle Ca. Additionally, if the steering wheel return amount Q is large, the driver of the host vehicle Ca tends to proceed out of the roundabout at a relatively slower speed for safety purposes, and, depending on the return speed used for returning the detection regions Raa, Rba to the initial states Ra, Rb, there are cases where the detection regions Raa, Rba with a reduced region length would be returned to the initial states Ra, Rb before the host vehicle Ca proceeds out of the roundabout. Regarding this problem, as illustrated in Figure 13, in the present embodiment, the greater the absolute value of the steering wheel return amount Q, the slower the return speed V for returning the detection regions Raa, Rba to the initial states Ra, Rb; it is thus possible to return the detection regions Raa, Rba to the initial states Ra, Rb at a suitable time that corresponds to the shape of the roundabout, and to effectively prevent a trailing vehicle traveling in the traffic lane of the host vehicle Ca from being falsely detected as an adjacent vehicle. Conversely, the smaller the absolute value of the steering wheel return amount Q, the faster the return speed V for returning the detection regions Raa, Rba to the initial states Ra, Rb; thus, after the host vehicle Ca has proceeded out of the roundabout, since it is possible to return the detection regions Raa, Rba to the initial states Ra, Rb within a short time, it is possible to detect an adjacent vehicle at the proper time.
[093] Additionally, although the fifth embodiment described above provides an example of a configuration where the detection regions Raa, Rba with reduced region length are gradually returned to the initial states Ra, Rb based on the steering wheel return amount when the host vehicle Ca proceeds out of the roundabout, the present invention is not limited to this configuration; for example, as in the second embodiment described above, the configuration can be such that, when the detection regions Raa, Rba are rotated and moved in the direction opposite to the direction of rotation of the host vehicle Ca, the rotated and moved detection regions Raa, Rba are gradually returned to the initial states Ra, Rb based on the steering wheel return amount. Additionally, the configuration in this case can also be such that the greater the absolute value of the steering wheel return amount Q, the slower the return speed V for returning the rotated and moved detection regions Raa, Rba to the initial states Ra, Rb; and the smaller the absolute value of the steering wheel return amount Q, the faster the return speed V for returning the rotated and moved detection regions Raa, Rba to the initial states Ra, Rb.
[094] Finally, although the present embodiment provides an example of a configuration where the return process for returning the detection regions Raa, Rba is started at the time the steering wheel return amount is detected, the invention is not limited to this configuration; for example, as in the situational examples illustrated in Figure 12, the configuration can be such that the return process for returning the detection regions Raa, Rba to the initial states Ra, Rb is started at the time the steering wheel return amount is detected and the host vehicle Ca changes from the rotation state to the non-rotation state.
[095] This concludes the explanation of the driving assistance device according to the embodiments of the present invention; however, the present invention is not limited to the embodiments described above and may be modified insofar as the modifications are within the scope of the invention.
[096] For example, the embodiments described above present an example of a configuration where the position of the detection region with respect to the host vehicle Ca, or the shape or area of the detection region, is changed when the host vehicle Ca is in the rotation state, in order to exclude the portions of the detection regions that can generate a false recognition of a solid object; however, the invention is not limited to this configuration, and the following configurations can also be provided. For example, there may be a configuration in which, when creating the difference image, if the host vehicle Ca is in the rotation state, the output value for the difference in a region where a false recognition of a solid object can be generated is suppressed or prohibited, thereby suppressing false recognition of a solid object in that region. Additionally, there may be a configuration where, when a difference image is converted to binary with a predetermined threshold such that a region not smaller than the threshold is specified as a solid object, if the vehicle is in the rotation state, the threshold used for converting to binary the region where a false detection of an object can be generated is increased, thereby suppressing a false recognition of a solid object in that region.
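The raised-threshold variant of this idea can be sketched as a per-pixel binarization of the difference image. The function below is an assumed illustration of the mechanism, not the patent's implementation: the mask marking the false-recognition-prone region and the two threshold values are inputs chosen by the caller.

```python
import numpy as np

def binarize_difference(diff_img, base_threshold,
                        raise_mask=None, raised_threshold=None):
    """Binarize an absolute-difference image with a per-pixel threshold.

    Pixels at or above their threshold become 1 (solid-object candidates).
    While the vehicle is turning, raise_mask marks the region prone to
    false recognition, and raised_threshold is applied there, suppressing
    spurious detections without moving or reshaping the detection region.
    """
    thresholds = np.full(diff_img.shape, float(base_threshold))
    if raise_mask is not None:
        thresholds[raise_mask] = raised_threshold
    return (diff_img >= thresholds).astype(np.uint8)
```

Setting raised_threshold high enough effectively prohibits the output in the masked region, which covers the suppress-or-prohibit variant described above as well.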
[097] In addition, as illustrated in Figure 15(A), the embodiments described above present an example of a configuration where the shape of the detection region is changed to reduce the region length in the displacement direction, thereby changing the area of the detection region; however, the invention is not limited to this configuration. For example, as illustrated in Figure 15(B), there may be a configuration that excludes, from the regions used in generating a difference image, a region among the detection regions that can generate a false recognition of a solid object, thereby altering the area of the detection region. In other words, there may be a configuration that sets an excluded region within the detection region, as illustrated in Figure 15(B), and generates a difference image only in a target region that omits that excluded region from the detection region. Such an exclusion, from the regions used in the generation of a difference image, of a region among the detection regions that can generate a false recognition of an object likewise alters the area of the detection regions. Finally, Figure 15 is a diagram explaining other examples of changing the shape of a detection region.
[098] Additionally, the embodiments described above present an example of a configuration in which the area of the detection region is changed to exclude a detection region that can generate a false detection of a solid object; however, the invention is not limited to this configuration. For example, as illustrated in Figure 15(C), there may be a configuration where, without changing the area of the detection region, only the shape of the detection region is changed to exclude the portion that can generate a false detection of a solid object. For example, in the example illustrated in Figure 15(C), the detection region Ra is narrowed on the inner side of the direction of rotation, while the front part of the detection region Ra is made to project by a corresponding amount, so that the traffic lane in which the host vehicle Ca is traveling is not included in the detection region Ra when the host vehicle Ca is turning; in this way, the shape of the detection region can be changed without changing its area.
[099] Finally, the embodiments described above present an example of a configuration where the position of the detection regions A1, A2 is changed to exclude a detection region that can generate a false recognition of a solid object, which is performed either by moving the position of the detection regions A1, A2 in the vehicle width direction, or by rotating and moving the detection regions A1, A2; however, the invention is not limited to this configuration. For example, there may be a configuration that moves the position of the detection regions A1, A2 in the displacement direction FD of the host vehicle Ca to exclude the detection region that can generate a false recognition of a solid object.
Reference numbers
1 - Camera
2 - Wheel speed sensor
3 - Steering angle sensor
5 - State detection unit
6 - Camera
7 - Navigation system
10 - Controller
11 - Point of view conversion unit
12 - Rotation state detection unit
13 - Detection region modification unit
14 - Solid object detection unit
Claims (13)
[0001]
1. Driving assistance device, CHARACTERIZED by the fact that it comprises: a rotation state detection means for detecting a rotation state of a host vehicle based on the steering angle of the host vehicle; an image capture means, disposed on the host vehicle, that captures an image of a predetermined range, which includes detection regions corresponding to the lanes adjacent to the left and right behind the host vehicle in a non-rotation state, and sends a captured image; a three-dimensional object detection means that detects a trailing vehicle in the detection region in an aerial view image obtained by viewpoint conversion of the captured image; and a detection region modification means that reduces the size of the detection region in the aerial view image on the inner side of the rotation, compared with the non-rotation state, over a range that includes the region closest to the host vehicle, when the rotation state detection means detects that the host vehicle is in the rotation state.
[0002]
2. Driving assistance device according to claim 1, CHARACTERIZED by the fact that it further comprises a viewpoint conversion means that converts a plurality of the captured images, captured by the image capture means, into aerial view images, wherein the three-dimensional object detection means generates difference images corresponding to the time difference between the plurality of aerial view images, and detects a three-dimensional object within the detection region based on the difference images within the detection region.
[0003]
3. Driving assistance device according to claim 1 or 2, CHARACTERIZED by the fact that the detection region modification means reduces the region length of the detection region as the steering angle of the host vehicle increases.
[0004]
4. Driving assistance device according to any one of claims 1 to 3, CHARACTERIZED by the fact that the detection region modification means changes the shape or area of the detection region so that the degree of alteration of the region length of the detection region that corresponds to the inner side of the curve becomes greater than the degree of alteration of the region length of the detection region that corresponds to the outer side of the curve.
[0005]
5. Driving assistance device, CHARACTERIZED by the fact that it comprises: a rotation state detection means that detects a rotation state of a host vehicle; an image capture means, disposed on the host vehicle, that captures an image of a predetermined range, which includes detection regions corresponding to the lanes adjacent to the left and right behind the host vehicle in a non-rotation state, and sends a captured image; a three-dimensional object detection means that detects a trailing vehicle in the detection region in an aerial view image obtained by viewpoint conversion of the captured image; and a detection region modification means that rotates and moves a detection region in the aerial view image on the inner side of the curve in a direction opposite to the direction of rotation of the host vehicle when the rotation state detection means detects that the host vehicle is in the rotation state, wherein the detection region modification means rotates and moves the detection regions so that the rotation angle of the detection region corresponding to the inner side of the curve is greater than the rotation angle of the detection region corresponding to the outer side of the curve.
[0006]
6. Driving assistance device according to claim 5, CHARACTERIZED by the fact that the detection region modification means rotates and moves the detection region so that the rotation angle of the detection region with respect to a center line in the host vehicle's direction of travel increases as the host vehicle's steering angle increases.
[0007]
7. Driving assistance device according to any one of claims 1 to 6, CHARACTERIZED by the fact that the driving assistance device has a rotation state prediction means that predicts a rotation state of the host vehicle as a predicted rotation state; and the detection region modification means determines the probability of the predicted rotation state based on the rotation state of the host vehicle detected by the rotation state detection means and the predicted rotation state predicted by the rotation state prediction means; when the probability of the predicted rotation state is equal to or greater than a predetermined value, the position of the detection region relative to the host vehicle, or the area of the detection region, is changed based on the predicted rotation state; and when the probability of the predicted rotation state is less than the predetermined value, the position of the detection region with respect to the host vehicle, or the area of the detection region, is changed based on the rotation state of the host vehicle detected by the rotation state detection means.
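The selection logic of claim 7 reduces to a probability-gated switch between the predicted and the detected rotation state. Representing the rotation state by a turning radius and using a 0.7 threshold are illustrative assumptions; the claim only fixes the comparison against a predetermined value.

```python
def select_rotation_state(detected_radius_m, predicted_radius_m,
                          prediction_probability, probability_threshold=0.7):
    """Use the predicted rotation state when its estimated probability
    reaches the threshold; otherwise fall back to the rotation state
    detected from the steering angle."""
    if prediction_probability >= probability_threshold:
        return predicted_radius_m
    return detected_radius_m
```

The selected state would then feed the detection region modification means in place of the raw detected state.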
[0008]
8. Driving assistance device, according to claim 7, CHARACTERIZED by the fact that the rotation state prediction means includes at least one of: a prediction means that predicts the rotation state of the host vehicle based on a driver operation state, a prediction means that predicts the rotation state of the host vehicle based on a captured image sent from the image capture means or from an image capture means disposed on a front part of the host vehicle, and a prediction means that predicts the rotation state of the host vehicle based on the position of the host vehicle and map information in which road information is associated with position information.
[0009]
9. Driving assistance device according to any one of claims 1 to 8, CHARACTERIZED by the fact that the rotation state detection means has a return amount detection means that detects the return amount of a steering wheel return operation; the detection region modification means changes the position of the detection region relative to the host vehicle, or the area of the detection region, from an initial state of the detection region when it is determined that the host vehicle is in a rotating state, and then returns the position of the detection region relative to the host vehicle, or the area of the detection region, to the initial state when the steering wheel return amount is detected by the return amount detection means; and the detection region modification means adjusts the speed at which the position of the detection region relative to the host vehicle, or the area of the detection region, is returned to the initial state when the steering wheel return amount is detected so that it is slower than the speed at which the position of the detection region relative to the host vehicle, or the area of the detection region, is changed from the initial state of the detection region when it is determined that the host vehicle is in a rotating state.
[0010]
10. Driving assistance device, according to claim 9, CHARACTERIZED by the fact that the detection region modification means determines the speed at which the position of the detection region in relation to the host vehicle, or the area of the detection region, is returned to the initial state when the steering wheel return amount is detected by the return amount detection means, after the position of the detection region relative to the host vehicle, or the area of the detection region, is changed from an initial state of the detection region when it is determined that the host vehicle is in a rotating state, and returns the position of the detection region relative to the host vehicle, or the area of the detection region, to the initial state based on the determined speed.
[0011]
11. Driving assistance device, according to claim 10, CHARACTERIZED by the fact that the detection region modification means reduces the speed at which the position of the detection region in relation to the host vehicle, or the area of the detection region, is returned to the initial state as the absolute value of the steering wheel return amount increases.
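Claims 9 to 11 together prescribe an asymmetric rate: the region is restored to its initial state more slowly than it was changed, and the larger the absolute steering-wheel return amount, the slower the restoration. The 0.5 factor and the linear decay below are illustrative assumptions satisfying both constraints.

```python
def return_speed(base_change_speed, wheel_return_amount_deg, gain=0.02):
    """Speed at which the detection region is returned to its initial
    state: always slower than base_change_speed (the speed used when the
    region was changed on entering the turn), and decreasing as the
    absolute steering-wheel return amount grows. Constants are
    illustrative, not taken from the patent."""
    slow_factor = 1.0 + gain * abs(wheel_return_amount_deg)
    # Halve the entry speed, then slow further with the return amount.
    return 0.5 * base_change_speed / slow_factor
```

This mirrors the safety intent of the claims: the shrunken or rotated region is restored gradually so that a trailing vehicle is not missed while the steering wheel is still being returned.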
[0012]
12. Driving assistance device according to any one of claims 1 to 11, CHARACTERIZED by the fact that it further comprises a lane change intention detection means that detects an intention to change lane; wherein the detection region modification means, when the lane change intention detection means detects an intention to change lane, the host vehicle transitions from a rotating state to a non-rotating state, and the detection region is returned to the initial state, adjusts the speed at which the position of the detection region in relation to the host vehicle, or the area of the detection region, is returned to the initial state so that it is slower than the speed at which the position of the detection region relative to the host vehicle, or the area of the detection region, is changed from an initial state of the detection region when it is determined that the host vehicle is in a rotating state.
[0013]
13. Driving assistance method, CHARACTERIZED by the fact that a driving assistance device is used, comprising an image capture means, a rotation state detection means, a detection region modification means and a three-dimensional object detection means, the image capture means captures an image of a predetermined range, which includes a detection region that corresponds to the adjacent lanes behind the host vehicle on the left and right in a non-rotating state, and the three-dimensional object detection means detects a rear vehicle in the detection region in an aerial view image obtained by viewpoint conversion of the captured image, the method comprising: detecting, by the rotation state detection means, a rotation state of the host vehicle based on the steering angle of the host vehicle; and reducing, by the detection region modification means, the size of the detection region in the aerial view image on the inner side of the curve, compared to a non-rotating state, in a range that includes the region close to the host vehicle, when the host vehicle is detected to be in the rotating state.
Patent family:
Publication number | Publication date
MX2014000649A|2014-04-30|
CN103718225B|2016-06-01|
RU2014107873A|2015-09-10|
RU2576362C2|2016-02-27|
WO2013018537A1|2013-02-07|
EP2741270B1|2020-11-25|
CN103718225A|2014-04-09|
EP2741270A4|2015-06-17|
EP2741270A1|2014-06-11|
BR112014001155A2|2019-12-17|
JPWO2013018537A1|2015-03-05|
MY180606A|2020-12-03|
US20140169630A1|2014-06-19|
US9235767B2|2016-01-12|
JP5862670B2|2016-02-16|
Legal status:
2019-04-30| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2019-11-05| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2019-12-03| B06I| Publication of requirement cancelled [chapter 6.9 patent gazette]|Free format text: The publication under code 6.21 in RPI No. 2548 of 05/11/2019 is annulled, as it was issued in error.|
2020-02-11| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2021-04-27| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2021-06-01| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: Term of validity: 20 (twenty) years counted from 17/07/2012, subject to the legal conditions.|
Priority:
Application number | Filing date | Patent title
JP2011-168895|2011-08-02|
JP2011168895|2011-08-02|
PCT/JP2012/068109|WO2013018537A1|2011-08-02|2012-07-17|Driving assistance apparatus and driving assistance method|